
Add litellm #450

Closed · wants to merge 4 commits

Conversation

PCSwingle (Member)

TODO: Add documentation telling users to check litellm for how to use other models.

Pull Request Checklist

  • Documentation has been updated, or this change doesn't require that

jakethekoenig (Member)

I'm having some trouble getting it to work with ollama. We need to set `litellm.drop_params` to `True` because the ollama API doesn't support `response_format`, and we also need to pass an `api_base` to the litellm completion calls. After I did that, though, I got the following exception, which may be a bug in litellm itself? Not sure.

Traceback (most recent call last):
  File "/Users/jakekoenig/mentat/.venv/lib/python3.11/site-packages/litellm/llms/ollama.py", line 260, in ollama_async_streaming
    status_code=response.status_code, message=response.text
                                              ^^^^^^^^^^^^^
  File "/Users/jakekoenig/mentat/.venv/lib/python3.11/site-packages/httpx/_models.py", line 574, in text
    content = self.content
              ^^^^^^^^^^^^
  File "/Users/jakekoenig/mentat/.venv/lib/python3.11/site-packages/httpx/_models.py", line 568, in content
    raise ResponseNotRead()
httpx.ResponseNotRead: Attempted to access streaming response content, without having called `read()`.
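
For context, the setup described above looks roughly like this (a minimal sketch; the model name, `api_base` URL, and prompt are placeholder assumptions for a locally running ollama server):

```python
import asyncio

import litellm

# ollama's API doesn't support response_format, so tell litellm to
# silently drop parameters the provider doesn't accept.
litellm.drop_params = True

async def main():
    # Assumed values: "ollama/llama2" and the default local ollama port.
    response = await litellm.acompletion(
        model="ollama/llama2",
        messages=[{"role": "user", "content": "Hello"}],
        api_base="http://localhost:11434",
        stream=True,
    )
    # Iterating the stream is where the exception above surfaced.
    async for chunk in response:
        print(chunk)

asyncio.run(main())
```

Reading the traceback, it looks like litellm got an error status from ollama and then tried to build an error message from `response.text` on a streaming httpx response without calling `read()` first, which raises `ResponseNotRead` and masks the underlying ollama error.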

PCSwingle (Member, Author)

Replaced by #451.

PCSwingle closed this Jan 4, 2024